AI Course Policies and Syllabus Language
Clear options and copy-ready language to define how generative AI may be used in your course.
Artificial intelligence is now part of the conditions of teaching and learning. Some faculty will restrict student AI use; others will integrate it intentionally. Either approach can be rigorous and defensible when expectations are clear and assessments measure what students actually learn.
You do not need to be an AI expert to teach well in an AI-present environment. Start by setting expectations, then use assessment design strategies that make student thinking visible.
Course Policies
Clear options and copy-ready language to define how generative AI may be used in your course.

Assessment
Guidance for designing assessments that still measure learning and support academic integrity in an AI-present environment.

Support
Get individualized support with AI policies, assessment design, or instructional questions related to your course.

Most instructors take one of three practical approaches to generative AI. Any of these approaches can be effective when expectations are clear and reinforced at the assignment level.
Choose the option that best matches your course goals and comfort level. You can adjust over time.
Restrict AI Use
This approach limits student use of generative AI except where it is explicitly required or permitted.
AI is prohibited unless explicitly allowed for a specific task
Assignment instructions clearly state what is and is not permitted
Major work includes process evidence such as drafts, reflections, or brief verification
Best fit if your course emphasizes foundational skills, individual practice, or early-stage learning.
Integrate AI Intentionally
This approach allows or encourages AI use as part of learning, with clear expectations and accountability.
AI use is permitted with required disclosure
Students explain how tools were used and what they changed or decided
Assessment emphasizes reasoning, decision-making, and evaluation of sources
Best fit if your course emphasizes analysis, synthesis, critique, or real-world application.
Take a Mixed Approach
A mixed approach provides flexibility while you learn what works best for your course.
Mixed policies reduce ambiguity when implemented clearly
AI allowance is decided by assignment type (practice, draft, final, exam)
Expectations can be revisited after the first major assessment
Best fit if you want to start cautiously and refine your approach as the semester progresses.
Once you identify your starting point, you can take practical next steps.
If you use generative AI in your teaching practice, treat it as support for planning and drafting, not as an authority. Verify content, check for bias, and do not enter sensitive information.
Example prompt: Generate three diverse scenarios that illustrate ethos, pathos, and logos. For each scenario, include one discussion question that requires students to apply the concept.
Example prompt: Explain a “nudge” in behavioral economics for students who are new to the topic. Provide one everyday example and one discipline-specific example. End with three self-check questions.
Example prompt: Present social learning theory in three formats: a short explanation, a worked example, and a common misconception with a correction.
Example prompt: Review the assignment description below. Identify where students may misunderstand expectations and suggest clearer language. Recommend a rubric outline aligned to the learning outcomes. (Include copy-and-pasted description following prompt.)
The growth of generative AI presents real challenges for academic integrity. MSU Denver, however, has decided not to adopt an enterprise-level AI detection tool because of the well-documented methodological, ethical, and procedural limitations of these technologies.
Reliance on AI detection tools raises significant methodological, ethical, and procedural concerns that limit their usefulness in academic contexts.
Rather than relying on surveillance-based tools, MSU Denver emphasizes a proactive, human-centered approach grounded in clear expectations, transparency, and intentional assessment design. This approach prioritizes student learning, preserves trust, and provides faculty with more reliable ways to evaluate student understanding.
Guidance on AI and Assessment
Faculty are encouraged to consult the AI and Assessment resource page for practical guidance on setting clear expectations for AI use, designing assessments aligned with learning outcomes, and responding appropriately when questions arise about student work. The resource includes concrete examples, assessment strategies, and recommended practices that are pedagogically sound and institutionally defensible.
Institutional Note on Privacy and Security: Submitting student work to third-party AI detection platforms may introduce security and privacy risks. Many tools lack transparency about data storage, secondary use, or whether submitted content may be incorporated into commercial datasets. Faculty should exercise caution and prioritize approaches that protect student data and comply with university data and privacy standards.